Image restoration by minimizing objective functions with nonsmooth data-fidelity terms
Abstract
We present a theoretical study of the recovery of images x from noisy data y by minimizing a regularized cost-function F(x, y) = Ψ(x, y) + αΦ(x), where Ψ is a data-fidelity term, Φ is a smooth regularization term and α > 0 is a parameter. Generally, Ψ is a smooth function; only a few papers make an exception. Non-smooth data-fidelity terms are avoided in image processing. In spite of this, we consider both smooth and non-smooth data-fidelity terms. Our ambition is to capture the essential features exhibited by the local minimizers of F in relation with the smoothness of Ψ. We focus on Ψ(x, y) = ∑_i ψ(aᵢᵀx − yᵢ), where the aᵢᵀ are linear operators and ψ is smooth on ℝ∖{0}. We show that if ψ′(0⁻) < ψ′(0⁺), typical data y lead to local minimizers x̂ of F(., y) which fit exactly part of the data entries: there is a possibly large set ĥ such that aᵢᵀx̂ = yᵢ for every i ∈ ĥ. In contrast, if ψ is smooth on ℝ, then for almost every y, the local minimizers of F(., y) do not fit any entry of y. Cost-functions with non-smooth data-fidelity terms thus exhibit a strong mathematical property which can be used in various ways. We then construct a cost-function allowing aberrant data to be detected and selectively smoothed. The obtained results advocate the use of non-smooth data-fidelity terms.
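The exact-fit property can be illustrated on a toy separable cost, which is an assumption made here for illustration and not the paper's general setting: take aᵢᵀ = eᵢᵀ (identity rows) and Φ(x) = ‖x‖², so F decouples into one-dimensional problems with closed-form minimizers. With the non-smooth fidelity ψ(t) = |t|, entries with small residual are fit exactly (x̂ᵢ = yᵢ), while with the smooth fidelity ψ(t) = t² no entry is fit exactly:

```python
import numpy as np

def min_l1_fidelity(y, alpha):
    """Elementwise minimizer of |x - y| + alpha * x**2.
    Optimality: x = y whenever |y| <= 1/(2*alpha); otherwise the
    solution is clipped at +-1/(2*alpha)."""
    t = 1.0 / (2.0 * alpha)
    return np.sign(y) * np.minimum(np.abs(y), t)

def min_l2_fidelity(y, alpha):
    """Elementwise minimizer of (x - y)**2 + alpha * x**2:
    x = y / (1 + alpha), which never equals y unless y = 0."""
    return y / (1.0 + alpha)

rng = np.random.default_rng(0)
y = rng.normal(size=1000)
alpha = 1.0  # exact-fit threshold is 1/(2*alpha) = 0.5

x1 = min_l1_fidelity(y, alpha)
x2 = min_l2_fidelity(y, alpha)
exact_l1 = int(np.sum(np.isclose(x1, y)))  # many entries fit exactly
exact_l2 = int(np.sum(np.isclose(x2, y)))  # essentially none
print(exact_l1, exact_l2)
```

With standard normal data, roughly 38% of the entries satisfy |yᵢ| ≤ 0.5 and are therefore reproduced exactly by the ℓ₁-fidelity minimizer, mirroring the set ĥ of the abstract, whereas the quadratic fidelity shrinks every entry.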
Similar resources
A Class of Nonconvex Nonsmooth Approximate Potential Functions for Nonconvex Nonsmooth Image Restoration
Nonconvex nonsmooth potential functions have superior restoration performance for images with neat boundaries. However, several difficulties emerge in the numerical computation. Thus the graduated nonconvexity (GNC) method is suggested to deal with these problems. To improve the performance of the GNC method further, a class of nonconvex nonsmooth approximate potential functions have been co...
On Sequential Optimality Conditions without Constraint Qualifications for Nonlinear Programming with Nonsmooth Convex Objective Functions
Sequential optimality conditions provide adequate theoretical tools to justify stopping criteria for nonlinear programming solvers. Here, nonsmooth approximate gradient projection and complementary approximate Karush-Kuhn-Tucker conditions are presented. These sequential optimality conditions are satisfied by local minimizers of optimization problems independently of the fulfillment of constrai...
Smoothing Nonlinear Conjugate Gradient Method for Image Restoration Using Nonsmooth Nonconvex Minimization
Image restoration problems are often converted into large-scale, nonsmooth and nonconvex optimization problems. Most existing minimization methods are not efficient for solving such problems. It is well-known that nonlinear conjugate gradient methods are preferred to solve large-scale smooth optimization problems due to their simplicity, low storage, practical computation efficiency and nice co...
A Posteriori Restoration of Block Transform-Compressed Data
Block transform-compression techniques such as JPEG operate by systematically decomposing image data into segments or blocks that are transformed, quantized, and encoded independently. In this work we demonstrate a method for classifying images and applying combinations of existing quantization noise mitigation techniques that results in significant objective and subjective improvement in image ...
An Iterative Linear Expansion of Thresholds for ℓ₁-Based Image Restoration
This paper proposes a novel algorithmic framework to solve image restoration problems under sparsity assumptions. As usual, the reconstructed image is the minimum of an objective functional that consists of a data fidelity term and an ℓ₁ regularization. However, instead of estimating the reconstructed image that minimizes the objective functional directly, we focus on the restoration process th...
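A standard baseline for this kind of ℓ₁-based restoration, shown here as a generic sketch rather than the specific algorithm of the paper above, is the iterative soft-thresholding algorithm (ISTA) for minimizing ½‖Ax − y‖² + λ‖x‖₁:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=200):
    """Minimize 0.5 * ||A x - y||^2 + lam * ||x||_1 by iterating a
    gradient step on the smooth data term followed by soft thresholding."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)           # gradient of the data-fidelity term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Sanity check: with A = I the minimizer is soft_threshold(y, lam) in closed form.
y = np.array([2.0, -0.3, 0.0, 1.5, -4.0])
x_hat = ista(np.eye(5), y, lam=0.5)
```

Note that the thresholding step zeroes small coefficients exactly, which is the sparsity-promoting effect the ℓ₁ penalty is used for.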
Publication date: 2001